18 research outputs found

    Fuzzy to Random Uncertainty Alignment

    The objective of this paper is to present a new and simple mathematical approach to uncertainty alignment between fuzzy and random data. In particular, we present a method to describe a fuzzy (possibilistic) distribution in terms of a pair (or more) of related random (probabilistic) events, both fixed and variable. Our approach uses basic properties of both fuzzy and random distributions. We show that data fuzziness can be viewed as non-uniqueness of related random events. We also show how the fuzzy-random consistency principle can be given precise mathematical meaning. Various types of fuzzy distributions are examined, special cases considered, and several numerical examples presented.

    Grey Predictor Reference Model For Assisting Particle Swarm Optimization

    This paper proposes an approach that forms an average performance by Grey Modeling (GM) and uses that average performance as a reference model for evolutionary computation with an error-type performance index. The idea is to construct the reference model from the performance of unknown systems when users apply evolutionary computation to fine-tune control systems with an error-type performance index. We apply this approach to particle swarm optimization for searching the optimal gains of a baseline PI controller of wind turbines operating at a certain set point in Region 3. In the numerical simulation part, the corresponding results demonstrate the effectiveness of Grey Modeling.
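The abstract does not specify which grey model is used, but a minimal sketch of the standard GM(1,1) construction commonly used to form such a reference trajectory might look as follows (the function name and the fitting details are our illustrative assumptions, not taken from the paper):

```python
import math

def gm11_fit_predict(x0, n_ahead=0):
    """Fit a standard GM(1,1) grey model to sequence x0 and return
    fitted values plus n_ahead one-step-ahead predictions."""
    n = len(x0)
    # 1-AGO: accumulated generating operation
    x1 = [sum(x0[:k + 1]) for k in range(n)]
    # Background values: consecutive means of the accumulated series
    z = [0.5 * (x1[k] + x1[k + 1]) for k in range(n - 1)]
    # Least-squares solution of x0(k) = -a*z(k) + b for k = 2..n
    m = n - 1
    sz = sum(z)
    szz = sum(v * v for v in z)
    sy = sum(x0[1:])
    szy = sum(zv * yv for zv, yv in zip(z, x0[1:]))
    det = m * szz - sz * sz  # assumes a well-conditioned sequence
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det
    # Time-response function, then inverse AGO back to the original scale
    def x1_hat(k):
        return (x0[0] - b / a) * math.exp(-a * k) + b / a
    return [x0[0]] + [x1_hat(k) - x1_hat(k - 1)
                      for k in range(1, n + n_ahead)]
```

The fitted sequence (or its extrapolation) would then serve as the "average performance" reference against which the PSO's error-type index is evaluated.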

    Some Extensions to Classic Lotka-Volterra Modeling For Predator Prey Applications

    In this paper we present some specific cases of the classic Nonlinear Lotka-Volterra (NLV) approach to modeling predator-prey dynamic systems [1,5], and propose to implement them using a "mathematical" (Matlab) approach as well as an "ad-hoc" approach using Agent Based Modeling (implemented in the NetLogo modeling environment) [6]. Examples of various scenarios are introduced gradually, from simpler to more complex ones. The emphasis is on gaining insight into the predator-prey relationship, on structural results [2,3] as applied to classic complex systems modeling and control, and on understanding stability in multispecies communities. The paper sets the scene for further research using NLV (mathematical) and ABM (ad-hoc) models. With this "parallel" approach we hope to address some classic problems such as Gause's Law, the Paradox of the Plankton, the Paradox of Enrichment (system-level instability), and Oksanen's description and trophic level numbers, as well as other current Complex Systems paradigms such as adaptivity and emergence.

    New Pose Estimation Methodology for Target Tracking and Identification

    Ground Moving Target Indicator (GMTI) and High Resolution Radar (HRR) systems can track the position and velocity of a ground moving target. Pose, the angle between the position and velocity vectors, can be derived from kinematic estimates of position and velocity, and it is often used to reduce the search space of target identification (ID) and Automatic Target Recognition (ATR) algorithms. Due to low resolution in some radar systems, the GMTI-estimated pose may exhibit large errors, contributing to faulty identification of potential targets. Our goal is to define a new methodology to improve the pose estimate. Besides applications in target tracking, there are numerous commercial applications in machine learning, augmented reality, and body tracking.
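The pose definition used above, the angle between the position and velocity vectors, reduces to elementary vector geometry; a minimal sketch (function and variable names are ours, not the paper's):

```python
import math

def pose_angle(position, velocity):
    """Angle in radians between position and velocity vectors (2-D or 3-D),
    as derived from kinematic estimates."""
    dot = sum(p * v for p, v in zip(position, velocity))
    norm_p = math.sqrt(sum(p * p for p in position))
    norm_v = math.sqrt(sum(v * v for v in velocity))
    if norm_p == 0.0 or norm_v == 0.0:
        raise ValueError("pose is undefined for a zero-length vector")
    # Clamp against floating-point round-off before taking acos
    cos_a = max(-1.0, min(1.0, dot / (norm_p * norm_v)))
    return math.acos(cos_a)
```

Because this angle inherits the errors of both kinematic estimates, low-resolution position and velocity translate directly into a noisy pose, which is the error source the paper's methodology aims to reduce.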

    Internet of Things: Current Technological Review and New Low Power Wireless Sensor Network Protocol Proposal

    This paper addresses the Internet of Things (IoT) with a state-of-the-art approach. The purpose is to give insight into the concept of “smart living”, a concept that meets the requirements of today’s modern society. Implementation of this new technology requires new hardware and software installed and run on devices (“things”) connected to the Internet anytime and anywhere. To make this new technology viable for wide use, a few technological, standards, and legal issues need to be solved. In view of this, a new low-power wireless sensor network protocol is proposed in the IoT spirit.

    Uncertainty balance principle

    The objective of this paper is to present a new and simple mathematical approach to uncertainty transformation from fuzzy to random or random to fuzzy data. In particular, we present a method to describe a fuzzy (possibilistic) distribution in terms of a pair (or more) of related random (probabilistic) events, both fixed and variable. Our approach uses basic properties of both fuzzy and random distributions, and it assumes data is both possibilistic and probabilistic. We show that data fuzziness can be viewed as non-uniqueness of related random events, and prove our Uncertainty Balance Principle. We also show how Zadeh’s fuzzy-random Consistency Principle can be given precise mathematical meaning. Various types of fuzzy distributions are examined and several numerical examples presented.

    Testing the hypothesis that variations in atmospheric water vapour are the main cause of fluctuations in global temperature

    A hypothesis that the increasing application of both surface and ground water for irrigation of crops is a significant source of anthropogenic global warming is tested. In climate models, water is already assigned a major secondary amplifying role in warming, solely as a positive feedback from an atmosphere previously warmed by other GHGs. However, this conclusion ignores the direct anthropogenic forcing from the increasing use of water in dry regions to grow crops for the human population. The area irrigated worldwide increased by around 1.5% annually between 1960 and 2000, almost trebling in magnitude. Importantly, though covering only a small proportion of the Earth’s surface, this additional water vapour is dynamically focussed on dry land, intensifying its potential to elevate the troposphere and reduce the regional OLR. Our modelling analysis suggests that the increase in atmospheric water vapour from irrigation could be significantly more than 1% by 2050 compared to 1950, imposing a global forcing exceeding 1.0 W/m2. Fortunately, this hypothesis can be tested, for example, using the satellite data on OLR acquired since the 1970s, relating this to local trends of increasing irrigation or major floods in arid regions. If found consistent with the data, current proposals to mitigate climate change by limiting combustion of fossil fuels may prove less effective. This prediction regarding the warming effect of increasing irrigation is tested using NCAR reanalysis data, made possible by the natural experiments of the periodic flooding of Lake Eyre in Australia's semi-arid interior. It is recommended that this hypothesis be tested using data from local studies in irrigated regions, such as changes in outgoing longwave radiation and increased absorption of incoming shortwave radiation in air.

    Kalman Filter Harmonic Bank for Vostok Ice Core Data Analysis and Climate Predictions

    The Vostok ice core data cover 420,000 years, indicating the natural regularity of Earth’s surface temperature and climate. Here we consider four major cycles of similar duration, ranging from 86,000 to 128,000 years, with warming interglacials comprising some 15% of each period compared to some 85% for cooling. Globally, we are near the peak of a rapid warming period. We perform a detailed frequency analysis of temperature and CO2 cycles as a primary stage in building a logical Climate Prediction Engine (CPE), illustrated with specific harmonics. This analysis can be repeated for all harmonics and various cycle combinations. Our time correlation estimates the CO2 time lag behind temperature at 400–2300 years, depending on the cycle, longer on average than previously concluded. We also perform Fast Fourier Transform analysis, identifying a full harmonic spectrum for each cycle, plus an energy analysis to identify each harmonic amplitude, in order to achieve further prediction analysis using a Kalman filter harmonic bank. For the Vostok data we can use combinations of earlier cycles for learning and then the current ongoing cycle for testing. Assuming causal time regularity, more cycles can be employed in training, hence reducing the prediction error for the next cycle. This results in prediction of climate data with both naturally occurring and human-forced CO2 values. We perform this detailed time and frequency analysis as a basis for improving the quality of our climate prediction methodologies, with particular attention to testing alternative hypotheses of the possible causes of climate change. These include the effect on albedo of suspended dust and of increasing water vapor with temperature in initiating interglacial warming; the effect of the temperature and pH of surface water on the ambient level of CO2 in the atmosphere; and a larger latent heat capacity of the atmosphere required to sustain its circulatory motions, leading to friction and turbulent release of heat in the boundary layer. All these potentials can be examined in an effective CPE.
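One plausible reading of the "Kalman filter harmonic bank" is a set of small Kalman filters, each locked to one candidate harmonic via rotation dynamics, with the best-explaining filter selected by innovation energy. A hedged sketch under that interpretation (parameter values and selection rule are our assumptions, not the paper's):

```python
import math

def kf_harmonic_innovation(z, omega, dt=1.0, q=1e-4, r=0.1):
    """Run a 2-state Kalman filter locked to angular frequency omega
    over signal z. State = [in-phase, quadrature] components; dynamics =
    rotation by omega*dt per step; measurement = in-phase component.
    Returns accumulated squared innovation (small when omega matches)."""
    c, s = math.cos(omega * dt), math.sin(omega * dt)
    x0, x1 = 0.0, 0.0
    P = [[1.0, 0.0], [0.0, 1.0]]
    innov_sq = 0.0
    for zk in z:
        # Predict: x <- R x, P <- R P R^T + Q (R = 2-D rotation matrix)
        x0, x1 = c * x0 - s * x1, s * x0 + c * x1
        a00 = c * P[0][0] - s * P[1][0]
        a01 = c * P[0][1] - s * P[1][1]
        a10 = s * P[0][0] + c * P[1][0]
        a11 = s * P[0][1] + c * P[1][1]
        P = [[c * a00 - s * a01 + q, s * a00 + c * a01],
             [c * a10 - s * a11, s * a10 + c * a11 + q]]
        # Update with measurement matrix H = [1, 0]
        S = P[0][0] + r
        innov = zk - x0
        k0, k1 = P[0][0] / S, P[1][0] / S
        x0 += k0 * innov
        x1 += k1 * innov
        P = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
             [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]
        innov_sq += innov * innov
    return innov_sq

def harmonic_bank(z, freqs, dt=1.0):
    """Pick the candidate frequency whose filter explains the data best."""
    scores = {f: kf_harmonic_innovation(z, 2 * math.pi * f, dt)
              for f in freqs}
    return min(scores, key=scores.get)
```

Applied to the Vostok harmonics identified by the FFT stage, such a bank would track the amplitude and phase of each retained harmonic forward in time, which is the prediction role the abstract assigns to it.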

    Time and frequency analysis of Vostok ice core climate data

    The periodicity of Vostok ice core temperature and gas concentration data indicates an inherent long-term regularity of Earth’s past climate, with a period of around 100,000 years: warming for around 15,000 years and cooling for around 85,000. At this point we are at the top of one of the warming periods. The Vostok data cover around 430,000 years, i.e. 4 climate cycles (warming-cooling) of similar but not identical duration. In this paper we perform a detailed time and frequency analysis of these data for each cycle as well as their various combinations, including the full period of 430,000 years. Time correlation analysis allows a more accurate estimate of the previously noted time lag in each cycle between temperature change and carbon dioxide content. We estimate these lags to lie between 1000 and 2500 years, longer than previously concluded. On the frequency side we perform Fast Fourier analysis to identify the full spectrum of harmonics for the various cycles, and then perform an energy analysis to identify which harmonics contribute the most. The idea is to reduce the computational load for further modeling and analysis using a Kalman filter based prediction method. Once the prediction model is defined (in a follow-up paper), the data will be split into two segments, learning and testing, in preparation for a machine learning fine-tuning methodology. We can use the last three, the last two, or even just the last cycle to learn on, and then the current ongoing cycle to test on. This will result in real-time prediction of relevant climate data. Assuming causal time regularity, the more of these cycles are employed in training, the more the prediction error for the next cycle should be reduced. Hence it is critical to perform a very detailed time and frequency analysis of the Vostok data as a solid basis for the prediction model to follow.
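The harmonic energy analysis described above, computing the spectrum and ranking harmonics by their energy contribution, can be illustrated on a synthetic cycle; this is a plain discrete Fourier transform, not the paper's code, shown here without any windowing or detrending the authors may apply:

```python
import cmath
import math

def dft_energies(x):
    """Return per-harmonic energies |X_k|^2 / n for k = 0 .. n//2,
    where X_k is the k-th DFT coefficient of the real sequence x."""
    n = len(x)
    energies = []
    for k in range(n // 2 + 1):
        X = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        energies.append(abs(X) ** 2 / n)
    return energies

def dominant_harmonics(x, top=3):
    """Indices of the `top` non-DC harmonics with the largest energy."""
    e = dft_energies(x)
    return sorted(range(1, len(e)), key=lambda k: e[k], reverse=True)[:top]
```

Keeping only the few harmonics returned by `dominant_harmonics` is the load-reduction step the abstract motivates: the downstream Kalman filter then needs one small model per retained harmonic rather than the full spectrum.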

    Internet of things: Current technological review

    This paper is a review of the Internet of Things (IoT) with a standards- and industry-oriented state-of-the-art approach. The purpose is to give insight into the concept of “smart living”, a concept that meets the requirements of today’s modern individuals and society. Implementation of this new technology requires new hardware and software installed and run on devices (“things”) connected to the Internet anytime and anywhere. To make this new technology viable for wide use, a few technological, standards, and legal issues need to be solved. Several key companies (such as Intel, Cisco, and IBM) are proposing their own standards in both HW and SW solutions, and time will tell which standard emerges as the dominant one. Standards are the key to worldwide acceptance of this new technology, as are underlying wireless data technologies such as WiFi, ZigBee, and the new emerging 4G and 5G mobile technologies.